Overcoming Catastrophic Forgetting by Incremental Moment Matching

Authors

  • Sang-Woo Lee
  • Jin-Hwa Kim
  • Jaehyun Jun
  • Jung-Woo Ha
  • Byoung-Tak Zhang
Abstract

Catastrophic forgetting is the problem whereby a neural network loses the information learned on a first task after being trained on a second task. Here, we propose a method, incremental moment matching (IMM), to resolve this problem. IMM incrementally matches the moments of the posterior distributions of the neural networks trained on the first and the second task, respectively. To make the search space of the posterior parameters smooth, the IMM procedure is complemented by various transfer-learning techniques, including weight transfer, an L2-norm penalty between the old and the new parameters, and a variant of dropout that uses the old parameters. We evaluate our approach on a variety of datasets, including the MNIST, CIFAR-10, Caltech-UCSD Birds, and Lifelog datasets. The experimental results show that IMM achieves state-of-the-art performance by balancing the information between an old and a new network.
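The abstract describes the moment matching only at a high level. As one way to make it concrete, the sketch below implements the two merging rules commonly associated with IMM, a plain parameter average (mean-IMM) and a diagonal-Fisher-weighted average (mode-IMM), on flattened parameter vectors. The function names, the toy shapes, and the use of NumPy are illustrative assumptions, not the authors' code.

```python
# A minimal sketch of the two moment-matching rules, assuming two networks'
# parameters have been flattened into vectors. Not the authors' implementation.
import numpy as np

def mean_imm(theta_1, theta_2, alpha=0.5):
    """Mean-IMM: match the first moment by averaging the two networks'
    parameters with mixing ratio alpha."""
    return alpha * theta_1 + (1.0 - alpha) * theta_2

def mode_imm(theta_1, theta_2, fisher_1, fisher_2, alpha=0.5, eps=1e-8):
    """Mode-IMM: precision-weighted average, where the diagonal Fisher
    information of each network approximates the posterior precision."""
    p1 = alpha * fisher_1
    p2 = (1.0 - alpha) * fisher_2
    return (p1 * theta_1 + p2 * theta_2) / (p1 + p2 + eps)

# Toy usage: two "networks" with 4 parameters each.
theta_1 = np.array([0.2, -1.0, 0.5, 0.3])   # trained on task 1
theta_2 = np.array([0.4, -0.8, 0.1, 0.9])   # trained on task 2 after weight transfer
fisher_1 = np.array([2.0, 0.5, 1.0, 0.1])   # diagonal Fisher estimates (assumed given)
fisher_2 = np.array([0.5, 1.5, 0.2, 2.0])

print(mean_imm(theta_1, theta_2))
print(mode_imm(theta_1, theta_2, fisher_1, fisher_2))
```

In the paper's setup, the second network is initialized by weight transfer from the first, and the L2 and dropout variants smooth the path between the two solutions, which is what makes such simple averaging viable.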


Related papers

Ensemble of SVMs for Incremental Learning

Support Vector Machines (SVMs) have been successfully applied to solve a large number of classification and regression problems. However, SVMs suffer from the catastrophic forgetting phenomenon, which results in loss of previously learned information. Learn++ has recently been introduced as an incremental learning algorithm. The strength of Learn++ lies in its ability to learn new data without for...
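The excerpt cuts off before the method details. As a rough illustration of the general idea, an ensemble that absorbs new data by adding classifiers rather than retraining old ones, here is a hedged sketch in the spirit of Learn++-style incremental ensembles; the class name, the accuracy-based voting weights, and the scikit-learn SVC usage are assumptions for illustration only.

```python
# A generic incremental-ensemble sketch: each new batch trains a fresh SVM
# that joins a weighted-majority vote, so earlier members are never discarded.
import numpy as np
from sklearn.svm import SVC

class IncrementalSVMEnsemble:
    def __init__(self):
        self.members = []  # list of (classifier, voting weight) pairs

    def learn_batch(self, X, y):
        clf = SVC(kernel="rbf").fit(X, y)
        # Weight each member by its accuracy on its own batch, a simple
        # stand-in for Learn++'s error-based voting weights (assumption).
        acc = clf.score(X, y)
        weight = np.log(max(acc, 1e-6) / max(1.0 - acc, 1e-6))
        self.members.append((clf, max(weight, 1e-3)))

    def predict(self, X):
        labels = np.array([clf.predict(X) for clf, _ in self.members])
        weights = np.array([w for _, w in self.members])
        preds = []
        for col in labels.T:  # weighted vote per sample
            classes, idx = np.unique(col, return_inverse=True)
            preds.append(classes[np.argmax(np.bincount(idx, weights=weights))])
        return np.array(preds)
```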


A Strategy for an Uncompromising Incremental Learner

Multi-class supervised learning systems require knowledge of the entire range of labels they predict. When learnt incrementally, they often suffer from catastrophic forgetting. To avoid this, generous leeway has to be given to the philosophy of incremental learning, either forcing part of the machine not to learn or retraining the machine with a selection of the historic data. ...


Selective Experience Replay for Lifelong Learning

Deep reinforcement learning has emerged as a powerful tool for a variety of learning tasks; however, deep networks typically exhibit forgetting when learning multiple tasks in sequence. To mitigate forgetting, we propose an experience replay process that augments the standard FIFO buffer and selectively stores experiences in a long-term memory. We explore four strategies for selecting which experien...
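The excerpt names neither the four selection strategies nor the buffer details. The sketch below shows one plausible realization of the described scheme: a FIFO buffer for recent experience plus a long-term memory filled by reservoir sampling so that earlier tasks stay represented. All names, and the choice of reservoir sampling as the selection rule, are assumptions.

```python
# A hedged sketch of a FIFO replay buffer augmented with a selectively
# filled long-term memory; class and method names are illustrative.
import random
from collections import deque

class SelectiveReplayBuffer:
    def __init__(self, fifo_size=1000, long_term_size=500, seed=0):
        self.fifo = deque(maxlen=fifo_size)  # short-term FIFO buffer
        self.long_term = []                  # selectively retained memory
        self.long_term_size = long_term_size
        self.seen = 0                        # total experiences observed
        self.rng = random.Random(seed)

    def add(self, experience):
        self.fifo.append(experience)
        self.seen += 1
        # Reservoir sampling keeps a uniform sample over the whole stream,
        # so old tasks remain represented after new tasks arrive.
        if len(self.long_term) < self.long_term_size:
            self.long_term.append(experience)
        else:
            j = self.rng.randrange(self.seen)
            if j < self.long_term_size:
                self.long_term[j] = experience

    def sample(self, batch_size):
        # Mix recent (FIFO) and long-term experiences in each batch.
        pool = list(self.fifo) + self.long_term
        return self.rng.sample(pool, min(batch_size, len(pool)))
```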


Overcoming catastrophic forgetting with hard attention to the task

Catastrophic forgetting occurs when a neural network loses the information learned in a previous task after training on subsequent tasks. This problem remains a hurdle for artificial intelligence systems with sequential learning capabilities. In this paper, we propose a task-based hard attention mechanism that preserves previous tasks’ information without affecting the current task’s learning. ...
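As a rough sketch of the mechanism the excerpt describes, the code below computes a near-binary, task-conditioned attention mask from a learned task embedding, gates a layer's activations with it, and scales gradients away from units claimed by previous tasks. The per-unit (rather than per-weight) gradient scaling and all names (task_mask, s_max, ...) are simplifying assumptions, not the paper's implementation.

```python
# A simplified sketch of hard attention to the task, using NumPy.
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-np.clip(x, -50.0, 50.0)))

def task_mask(task_embedding, s_max=400.0):
    """Near-binary attention mask from a learned task embedding;
    the large scale s_max saturates the sigmoid toward {0, 1}."""
    return sigmoid(s_max * task_embedding)

def gated_forward(h, mask):
    """Gate a layer's activations with the current task's mask."""
    return h * mask

def gradient_scale(prev_masks):
    """Units claimed by any previous task (mask near 1) get their
    gradients scaled toward 0, preserving old-task information."""
    claimed = np.max(np.stack(prev_masks), axis=0)
    return 1.0 - claimed

# Toy usage with a 5-unit layer and two earlier tasks.
h = np.random.randn(5)
m = task_mask(np.random.randn(5))
prev = [task_mask(np.random.randn(5)) for _ in range(2)]
print(gated_forward(h, m))
print(gradient_scale(prev))
```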


A palimpsest memory based on an incremental Bayesian learning rule

Capacity limited memory systems need to gradually forget old information in order to avoid catastrophic forgetting where all stored information is lost. This can be achieved by allowing new information to overwrite old, as in the so-called palimpsest memory. This paper describes a new such learning rule employed in an attractor neural network. The network does not exhibit catastrophic forgettin...



Journal:

Volume   Issue

Pages   -

Publication date: 2017